Large-scale l0 sparse inverse covariance estimation

Authors

  • Goran Marjanovic
  • Magnus O. Ulfarsson
  • Victor Solo
Abstract

There has been significant interest in sparse inverse covariance estimation in areas such as statistics, machine learning, and signal processing. In this problem, the sparse inverse of the covariance matrix of a multivariate normal distribution is estimated. The matrix estimator is obtained by solving a penalised log-likelihood (PLL) optimisation problem, where the penalty is responsible for inducing sparsity. The most natural sparsity-promoting penalty is the non-convex l0 function. Due to speed and memory limitations, existing algorithms for the non-convex l0 PLL problem cannot be used in high-dimensional settings. Here we address this issue by presenting a new block iterative approach for this problem, which can handle large-scale data sizes. Simulations demonstrate that our approach outperforms existing methods for this problem.
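The PLL problem described above can be stated compactly. In standard notation (the symbols below are assumed conventions, since the abstract itself gives no formulas):

```latex
\hat{\Theta} \;=\; \operatorname*{arg\,min}_{\Theta \succ 0}\;
\operatorname{tr}(S\Theta) \;-\; \log\det\Theta \;+\; \lambda\,\|\Theta\|_0 ,
```

where S is the sample covariance matrix, Θ the precision (inverse covariance) matrix, ‖Θ‖_0 counts the nonzero entries of Θ, and λ > 0 controls the degree of sparsity. Replacing ‖Θ‖_0 with the l1 norm gives the convex relaxation discussed in the related works below.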


Similar resources

l0 Sparse Inverse Covariance Estimation

Recently, there has been focus on penalised log-likelihood covariance estimation for sparse inverse covariance (precision) matrices. The penalty is responsible for inducing sparsity, and a very common choice is the convex l1 norm. However, the best estimator performance is not always achieved with this penalty. The most natural sparsity-promoting "norm" is the non-convex l0 penalty, but its lack ...


A Well-Conditioned and Sparse Estimation of Covariance and Inverse Covariance Matrices Using a Joint Penalty

We develop a method for estimating well-conditioned and sparse covariance and inverse covariance matrices from a sample of vectors drawn from a sub-Gaussian distribution in a high-dimensional setting. The proposed estimators are obtained by minimising a quadratic loss function together with a joint penalty of the l1 norm and the variance of its eigenvalues. In contrast to some of the existing methods of covariance...


A Block-Coordinate Descent Approach for Large-scale Sparse Inverse Covariance Estimation

The sparse inverse covariance estimation problem arises in many statistical applications in machine learning and signal processing. In this problem, the inverse of a covariance matrix of a multivariate normal distribution is estimated, assuming that it is sparse. An l1-regularized log-determinant optimization problem is typically solved to approximate such matrices. Because of memory limitation...
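As a concrete illustration (not the algorithm of any of the papers listed here), the l1-regularized log-determinant objective mentioned above can be evaluated in a few lines of NumPy. The function name and the convention of penalising only off-diagonal entries are illustrative assumptions:

```python
import numpy as np

def l1_logdet_objective(Theta, S, lam):
    """Evaluate tr(S @ Theta) - log det(Theta) + lam * ||off-diag(Theta)||_1.

    Theta : candidate precision matrix (must be positive definite)
    S     : sample covariance matrix
    lam   : regularisation weight (lam > 0)
    """
    try:
        np.linalg.cholesky(Theta)  # verify positive definiteness
    except np.linalg.LinAlgError:
        return np.inf  # outside the feasible cone
    _, logdet = np.linalg.slogdet(Theta)
    # l1 penalty on off-diagonal entries only (a common convention)
    off_diag_l1 = np.abs(Theta).sum() - np.abs(np.diag(Theta)).sum()
    return np.trace(S @ Theta) - logdet + lam * off_diag_l1
```

For example, with S = Theta = I (the 3x3 identity) the trace term is 3, the log-determinant and penalty are 0, so the objective is 3.0 for any lam. Minimising this objective over positive definite Theta (e.g. by coordinate descent) is the convex relaxation of the l0 problem in the main abstract.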



A multilevel framework for sparse optimization with application to inverse covariance estimation and logistic regression

Solving l1-regularized optimization problems is common in the fields of computational biology, signal processing, and machine learning. Such l1 regularization is utilized to find sparse minimizers of convex functions. A well-known example is the LASSO problem, where the l1 norm regularizes a quadratic function. A multilevel framework is presented for solving such l1-regularized sparse optimizati...




Publication date: 2016